Flag manifold


Nested subspace learning with flags

Szwagier, Tom, Pennec, Xavier

arXiv.org Machine Learning

Many machine learning methods look for low-dimensional representations of the data. The underlying subspace can be estimated by first choosing a dimension $q$ and then optimizing a certain objective function over the space of $q$-dimensional subspaces (the Grassmannian). Trying different values of $q$ generally yields non-nested subspaces, which raises an important issue of consistency between the data representations. In this paper, we propose a simple trick to enforce nestedness in subspace learning methods: lifting Grassmannian optimization problems to flag manifolds (the space of nested subspaces of increasing dimension) via nested projectors. We apply the flag trick to several classical machine learning methods and show that it successfully addresses the nestedness issue.
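The nested projectors mentioned in the abstract can be illustrated with a small sketch (an assumption-laden toy, not the paper's implementation): given a single orthonormal basis U, the subspaces spanned by its first q columns are automatically nested, and the corresponding projectors P_q absorb each other.

```python
import numpy as np

rng = np.random.default_rng(0)

# A single orthonormal basis of R^5 (e.g. as a flag-manifold optimizer
# would produce); the column order encodes the flag.
U, _ = np.linalg.qr(rng.standard_normal((5, 5)))

def projector(U, q):
    """Orthogonal projector onto the span of the first q columns of U."""
    Uq = U[:, :q]
    return Uq @ Uq.T

P2, P4 = projector(U, 2), projector(U, 4)

# Nestedness: projecting onto the 4-dim subspace and then onto the
# 2-dim one is the same as projecting onto the 2-dim one directly,
# i.e. span(U[:, :2]) is contained in span(U[:, :4]).
assert np.allclose(P2 @ P4, P2)
assert np.isclose(np.trace(P2), 2)  # rank of the projector
```

By contrast, two Grassmannian optimizations run independently for q=2 and q=4 have no reason to produce bases related in this way, which is exactly the consistency issue the flag trick removes.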


Fun with Flags: Robust Principal Directions via Flag Manifolds

Mankovich, Nathan, Camps-Valls, Gustau, Birdal, Tolga

arXiv.org Machine Learning

Principal component analysis (PCA), along with its extensions to manifolds and outlier-contaminated data, has been indispensable in computer vision and machine learning. In this work, we present a unifying formalism for PCA and its variants, and introduce a framework based on flags of linear subspaces, i.e. hierarchies of nested linear subspaces of increasing dimension, which not only allows for a common implementation but also yields novel variants not explored previously. We begin by generalizing traditional PCA methods that either maximize variance or minimize reconstruction error. We expand these interpretations to develop a wide array of new dimensionality reduction algorithms by accounting for outliers and the data manifold. To devise a common computational approach, we recast robust and dual forms of PCA as optimization problems on flag manifolds. We then integrate tangent space approximations of principal geodesic analysis (tangent-PCA) into this flag-based framework, creating novel robust and dual geodesic PCA variations. The remarkable flexibility offered by the 'flagification' introduced here enables even more algorithmic variants identified by specific flag types. Last but not least, we propose an effective convergent solver for these flag formulations employing the Stiefel manifold. Our empirical results on both real-world and synthetic scenarios demonstrate the superiority of our novel algorithms, especially in terms of robustness to outliers on manifolds.
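The claim that classical PCA already fits the flag formalism can be seen in a few lines (a minimal sketch, not the paper's solver): the ordered principal directions induce a flag span(v1) ⊂ span(v1, v2) ⊂ ..., so the subspaces obtained for different target dimensions are nested by construction.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 6))
X -= X.mean(axis=0)          # center the data

# Classical PCA via SVD: rows of Vt are principal directions,
# ordered by explained variance.
_, _, Vt = np.linalg.svd(X, full_matrices=False)

# The first-k directions give projectors onto a flag of subspaces.
projs = [Vt[:k].T @ Vt[:k] for k in range(1, 4)]

# Nestedness falls out for free: each larger projector absorbs
# the smaller ones.
for small, big in zip(projs, projs[1:]):
    assert np.allclose(big @ small, small)
```

The flag-based framework in the paper generalizes this picture to robust and geodesic PCA variants, where nestedness no longer comes for free and must be enforced through the flag structure.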


Chordal Averaging on Flag Manifolds and Its Applications

Mankovich, Nathan, Birdal, Tolga

arXiv.org Artificial Intelligence

This paper presents a new, provably convergent algorithm for computing the flag-mean and flag-median of a set of points on a flag manifold under the chordal metric. The flag manifold is a mathematical space consisting of flags, which are sequences of nested subspaces of a vector space that increase in dimension. The flag manifold is a superset of a wide range of known matrix spaces, including Stiefel and Grassmannian manifolds, making it a general object that is useful in a wide variety of computer vision problems. To tackle the challenge of computing first-order flag statistics, we first transform the problem into one that involves auxiliary variables constrained to the Stiefel manifold. The Stiefel manifold is a space of orthogonal frames, and leveraging the numerical stability and efficiency of Stiefel-manifold optimization enables us to compute the flag-mean effectively. Through a series of experiments, we show the competence of our method in Grassmann and rotation averaging, as well as principal component analysis. We release our source code under https://github.com/nmank/FlagAveraging.
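For the simplest flag type, a single Grassmannian, the chordal mean has a well-known spectral form that hints at why Stiefel-constrained optimization works: the minimizer of the summed squared chordal distances is spanned by the top eigenvectors of the summed projectors. A hedged sketch of that special case (not the paper's general algorithm):

```python
import numpy as np

rng = np.random.default_rng(2)

# Three 2-dim subspaces of R^4, each given by an orthonormal basis.
subspaces = [np.linalg.qr(rng.standard_normal((4, 2)))[0] for _ in range(3)]

# Sum the orthogonal projectors of the input subspaces; the chordal
# mean subspace is spanned by the eigenvectors of the largest
# eigenvalues (a classical special case of the flag-mean).
P_sum = sum(U @ U.T for U in subspaces)
eigvals, eigvecs = np.linalg.eigh(P_sum)          # ascending eigenvalues
mean_basis = eigvecs[:, ::-1][:, :2]              # top-2 eigenvectors

# The result is an orthonormal basis of the mean subspace.
assert np.allclose(mean_basis.T @ mean_basis, np.eye(2))
```

For general flags the subspaces of the mean must themselves be nested, which is where the auxiliary Stiefel variables and the convergent solver described in the abstract come in.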


Closed-form geodesics and trust-region method to calculate Riemannian logarithms on Stiefel and its quotient manifolds

Nguyen, Du

arXiv.org Machine Learning

We provide two closed-form geodesic formulas for a family of metrics on the Stiefel manifold, parameterized by two positive numbers and having both the embedded and canonical metrics as special cases. The closed-form formulas allow us to compute geodesics by matrix exponential in reduced dimension for low-rank manifolds. Combining this with the use of Fr{\'e}chet derivatives to compute the gradient of the squared Frobenius distance between a geodesic's endpoint and a given point on the manifold, we show that the logarithm map and geodesic distance between two points on the manifold can be computed by {\it minimizing} this squared distance with a {\it trust-region} solver. This leads to a new framework for computing the geodesic distance on manifolds with a known geodesic formula but no closed-form logarithm map. We show the approach works well for Stiefel as well as flag manifolds. The logarithm map can be used to compute the Riemannian center of mass for these manifolds equipped with the above metrics. We also deduce simple trigonometric formulas for the Riemannian exponential and logarithm maps on the Grassmann manifold.
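The trigonometric exponential and logarithm maps on the Grassmann manifold mentioned in the last sentence are simple enough to sketch directly. The following uses the standard textbook formulas (an illustrative round-trip check, not the paper's parameterized Stiefel metrics or trust-region solver):

```python
import numpy as np

def grassmann_exp(Y, H):
    """Exponential map at Y (n x p orthonormal) of tangent H (Y^T H = 0).

    With the thin SVD H = U diag(s) V^T, the geodesic endpoint is
    Y V cos(s) V^T + U sin(s) V^T.
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    return Y @ Vt.T @ np.diag(np.cos(s)) @ Vt + U @ np.diag(np.sin(s)) @ Vt

def grassmann_log(Y, Z):
    """Logarithm map at Y of a nearby point Z (standard closed form)."""
    A = (np.eye(Y.shape[0]) - Y @ Y.T) @ Z @ np.linalg.inv(Y.T @ Z)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.arctan(s)) @ Vt

rng = np.random.default_rng(3)
Y, _ = np.linalg.qr(rng.standard_normal((6, 2)))
# A small tangent vector at Y (projected to the horizontal space).
H = 0.2 * (np.eye(6) - Y @ Y.T) @ rng.standard_normal((6, 2))
Z = grassmann_exp(Y, H)

# Round trip: log recovers the tangent, and exp(log) the same subspace.
H2 = grassmann_log(Y, Z)
Z2 = grassmann_exp(Y, H2)
assert np.allclose(H2, H, atol=1e-8)
assert np.allclose(Z2 @ Z2.T, Z @ Z.T, atol=1e-8)
```

The paper's framework targets the harder case, such as Stiefel and flag manifolds under a family of metrics, where a closed-form geodesic exists but no such closed-form logarithm does, so the log is instead obtained by minimizing the squared distance to the target point.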